- data statistical processing
- обработка данных статистическая
English-Russian glossary on space technology. 2015.
Data binning — a data pre-processing technique used to reduce the effects of minor observation errors. Original data values that fall within a given small interval, a bin, are replaced by a value representative of that interval, often the central value. … Wikipedia
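A minimal sketch of the binning idea described in the entry above, assuming NumPy is available, fixed-width bins, and the bin midpoint as the representative value; the function name and the sample numbers are illustrative and not taken from the entry:

    import numpy as np

    def bin_values(values, bin_width):
        """Replace each value with the midpoint of the fixed-width bin it falls into."""
        values = np.asarray(values, dtype=float)
        origin = values.min()
        # Index of the bin each value falls into, counted from the smallest value.
        bin_index = np.floor((values - origin) / bin_width)
        # Representative value: the central value (midpoint) of that bin.
        return origin + (bin_index + 0.5) * bin_width

    # Nearby noisy measurements collapse onto a few representative values.
    print(bin_values([1.02, 1.07, 1.93, 2.11, 2.95], bin_width=0.5))
    # -> [1.27 1.27 1.77 2.27 2.77]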
Data sharing — the practice of making data used for scholarly research available to other investigators. Replication has a long history in science. The motto of The Royal Society is "Nullius in verba", translated "Take no man's word for it."[1] Many funding… Wikipedia
Data Protection Directive — The Data Protection Directive (officially Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data) is a European Union directive which regulates the processing of… Wikipedia
Data set — For the IBM mainframe term for a file, see Data set (IBM mainframe). A data set (or dataset) is a collection of data, usually presented in tabular form. Each column represents a particular variable. Each row corresponds to a given member of the data… Wikipedia
Statistical Lempel-Ziv — a lossless data compression technique published by Dr. Sam Kwong and Yu Fan Ho in 2001.[1] It may be viewed as a variant of Lempel-Ziv (LZ) based methods. The contribution of this concept is to include the statistical properties… Wikipedia
Statistical semantics — the study of how the statistical patterns of human word usage can be used to figure out what people mean, at least to a level sufficient for information access (Furnas, 2006). How can we figure out what words mean, simply by looking at… Wikipedia
Statistical parametric mapping (SPM) — a statistical technique for examining differences in brain activity recorded in functional neuroimaging experiments using technologies such as fMRI or PET. It may also refer to a specific piece of software created by… Wikipedia
Data Encryption Standard — [infobox figure residue: the Feistel function (F-function) of DES; designers: IBM; first published: …] Wikipedia
Statistical machine translation (SMT) — a machine translation paradigm in which translations are generated on the basis of statistical models whose parameters are derived from the analysis of bilingual text corpora. The statistical approach contrasts with the rule-based… Wikipedia
Statistical parsing — a group of parsing methods within natural language processing. These methods share the property of associating grammar rules with probabilities. Grammar rules are traditionally viewed in computational linguistics as defining the valid sentences… Wikipedia
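To illustrate what it means to associate grammar rules with probabilities, here is a toy Python sketch; the grammar, its rule probabilities, and the example sentence are invented for illustration and do not come from the entry above. A parse tree is scored by multiplying the probabilities of the rules used in its derivation:

    import math

    # Toy probabilistic grammar: each rewrite rule carries a probability,
    # and the probabilities of all rules sharing a left-hand side sum to 1.
    rule_prob = {
        ("S",   ("NP", "VP")): 1.0,
        ("NP",  ("Det", "N")): 0.6,
        ("NP",  ("N",)):       0.4,
        ("VP",  ("V", "NP")):  1.0,
        ("Det", ("the",)):     1.0,
        ("N",   ("dog",)):     0.5,
        ("N",   ("cat",)):     0.5,
        ("V",   ("chased",)):  1.0,
    }

    def derivation_prob(derivation):
        """Probability of a parse: the product of its rule probabilities."""
        return math.exp(sum(math.log(rule_prob[rule]) for rule in derivation))

    # Rules used in one derivation of "the dog chased the cat".
    parse = [
        ("S", ("NP", "VP")),
        ("NP", ("Det", "N")), ("Det", ("the",)), ("N", ("dog",)),
        ("VP", ("V", "NP")),  ("V", ("chased",)),
        ("NP", ("Det", "N")), ("Det", ("the",)), ("N", ("cat",)),
    ]
    print(derivation_prob(parse))  # 0.6 * 0.5 * 0.6 * 0.5 = 0.09 (up to float rounding)

A statistical parser searches over all derivations licensed by the grammar and returns the highest-probability one; this sketch only scores a single, already-chosen derivation.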
Data integrity — in its broadest meaning refers to the trustworthiness of system resources over their entire life cycle. In more analytic terms, it is the representational faithfulness of information to the true state of the object that the information represents … Wikipedia